    Approximating nonlinear functions with latent boundaries in low-rank excitatory-inhibitory spiking networks

    Deep feedforward and recurrent rate-based neural networks have become successful functional models of the brain, but they neglect obvious biological details such as spikes and Dale's law. Here we argue that these details are crucial in order to understand how real neural circuits operate. Towards this aim, we put forth a new framework for spike-based computation in low-rank excitatory-inhibitory spiking networks. By considering populations with rank-1 connectivity, we cast each neuron's spiking threshold as a boundary in a low-dimensional input-output space. We then show how the combined thresholds of a population of inhibitory neurons form a stable boundary in this space, and those of a population of excitatory neurons form an unstable boundary. Combining the two boundaries results in a rank-2 excitatory-inhibitory (EI) network with inhibition-stabilized dynamics at the intersection of the two boundaries. The computation of the resulting networks can be understood as the difference of two convex functions, and is thereby capable of approximating arbitrary nonlinear input-output mappings. We demonstrate several properties of these networks, including noise suppression and amplification, irregular activity and synaptic balance, as well as how they relate to rate network dynamics in the limit where the boundary becomes soft. Finally, while our work focuses on small networks (5-50 neurons), we discuss potential avenues for scaling up to much larger networks. Overall, our work proposes a new perspective on spiking networks that may serve as a starting point for a mechanistic understanding of biological spike-based computation.
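
    As a rough illustration of the boundary idea, the sketch below simulates a rank-1 population of inhibitory threshold neurons whose combined thresholds form a stable boundary that a leaky readout tracks. All parameter values and variable names are illustrative assumptions, not taken from the paper.

    import numpy as np

    # Illustrative sketch (parameters assumed, not from the paper): neuron i
    # spikes when F[i]*x + y exceeds thresh[i], i.e. when the state (x, y)
    # crosses the line y = thresh[i] - F[i]*x. Inhibitory spikes push the
    # readout y back down, so the population's combined thresholds act as a
    # stable boundary on y.
    N, T, dt = 20, 2000, 1e-3
    rng = np.random.default_rng(0)

    F = rng.uniform(-1.0, 1.0, N)       # input encoding weights
    D = -0.1 * np.ones(N)               # inhibitory decoding weights
    thresh = rng.uniform(0.2, 1.0, N)   # per-neuron spiking thresholds

    x = np.sin(np.linspace(0.0, 4.0 * np.pi, T))   # example input signal
    y, ys = 0.0, np.zeros(T)
    for t in range(T):
        V = F * x[t] + y                # rank-1: every neuron sees the readout
        spikes = V > thresh
        y += dt * (-y) + D @ spikes     # leak plus inhibitory spike kicks
        ys[t] = y                       # y stays below min_i(thresh[i] - F[i]*x)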

    Nonlinear computations in spiking neural networks through multiplicative synapses

    The brain efficiently performs nonlinear computations through its intricate networks of spiking neurons, but how this is done remains elusive. While nonlinear computations can be implemented successfully in spiking neural networks, this requires supervised training and the resulting connectivity can be hard to interpret. In contrast, the required connectivity for any computation in the form of a linear dynamical system can be directly derived and understood with the spike coding network (SCN) framework. These networks also have biologically realistic activity patterns and are highly robust to cell death. Here we extend the SCN framework to directly implement any polynomial dynamical system, without the need for training. This results in networks requiring a mix of synapse types (fast, slow, and multiplicative), which we term multiplicative spike coding networks (mSCNs). Using mSCNs, we demonstrate how to directly derive the required connectivity for several nonlinear dynamical systems. We also show how to carry out higher-order polynomials with coupled networks that use only pairwise multiplicative synapses, and provide expected numbers of connections for each synapse type. Overall, our work demonstrates a novel method for implementing nonlinear computations in spiking neural networks, while keeping the attractive features of standard SCNs (robustness, realistic activity patterns, and interpretable connectivity). Finally, we discuss the biological plausibility of our approach, and how the high accuracy and robustness of the approach may be of interest for neuromorphic computing. This article has been peer-reviewed and recommended by Peer Community in Neuroscience.
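
    A minimal sketch of the mSCN recipe for a toy one-dimensional polynomial system dx/dt = a*x + b*x^2. The decoder, thresholds, and fast/slow connectivities follow the standard SCN construction, and the multiplicative weights implement the quadratic term from pairwise products of filtered spike trains; the particular system and all parameter values are illustrative assumptions.

    import numpy as np

    a, b, lam = -1.0, -0.5, 10.0
    N, dt, T = 30, 1e-4, 50000
    rng = np.random.default_rng(1)

    D = rng.uniform(-0.1, 0.1, N)        # decoding weights, xhat = D @ r
    thresh = D**2 / 2                    # standard SCN thresholds
    W_fast = -np.outer(D, D)             # fast synapses (error correction)
    W_slow = (a + lam) * np.outer(D, D)  # slow synapses implement a*x
    W_mult = b * np.einsum('i,j,k->ijk', D, D, D)   # pairwise products -> b*x^2

    x0 = 2.0
    V = np.zeros(N)
    r = D * x0 / (D @ D)                 # initialize the readout at x(0)
    x = x0                               # ground-truth system, for comparison
    for t in range(T):
        s = (V > thresh).astype(float)
        if s.sum() > 1:                  # allow at most one spike per step
            k = np.argmax(V - thresh)
            s = np.zeros(N)
            s[k] = 1.0
        V += dt * (-lam * V + W_slow @ r
                   + np.einsum('ijk,j,k->i', W_mult, r, r)) + W_fast @ s
        r += dt * (-lam * r) + s
        x += dt * (a * x + b * x * x)

    xhat = D @ r                         # network estimate of x(T)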

    Training deep neural density estimators to identify mechanistic models of neural dynamics

    Mechanistic modeling in neuroscience aims to explain observed phenomena in terms of underlying causes. However, determining which model parameters agree with complex and stochastic neural data presents a significant challenge. We address this challenge with a machine learning tool that uses deep neural density estimators, trained using model simulations, to carry out Bayesian inference and retrieve the full space of parameters compatible with raw data or selected data features. Our method is scalable in parameters and data features and can rapidly analyze new data after initial training. We demonstrate the power and flexibility of our approach on receptive fields, ion channels, and Hodgkin–Huxley models. We also characterize the space of circuit configurations giving rise to rhythmic activity in the crustacean stomatogastric ganglion, and use these results to derive hypotheses for underlying compensation mechanisms. Our approach will help close the gap between data-driven and theory-driven models of neural dynamics.
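
    An open-source implementation of this style of simulation-based inference is the sbi Python package, which grew out of this line of work. Below is a minimal usage sketch; the two-parameter toy simulator and the "observation" are invented for illustration, and the API shown reflects recent package versions.

    import torch
    from sbi.inference import SNPE
    from sbi.utils import BoxUniform

    # Toy two-parameter simulator standing in for a mechanistic model
    # (e.g. Hodgkin-Huxley); a real use case would return summary features
    # of simulated voltage traces. Invented for illustration.
    def simulator(theta):
        return theta + 0.1 * torch.randn_like(theta)

    prior = BoxUniform(low=torch.zeros(2), high=torch.ones(2))

    theta = prior.sample((2000,))            # draw parameters from the prior
    x = simulator(theta)                     # simulate data for each draw

    inference = SNPE(prior=prior)
    inference.append_simulations(theta, x)
    density_estimator = inference.train()    # train the neural density estimator
    posterior = inference.build_posterior(density_estimator)

    x_o = torch.tensor([0.5, 0.5])           # "observed" data (invented)
    samples = posterior.sample((1000,), x=x_o)   # posterior over parameters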

    25th annual computational neuroscience meeting: CNS-2016

    The same neuron may play different functional roles in the neural circuits to which it belongs. For example, neurons in the Tritonia pedal ganglia may participate in variable phases of the swim motor rhythms [1]. While such neuronal functional variability is likely to play a major role in the delivery of the functionality of neural systems, it is difficult to study in most nervous systems. We work on the pyloric rhythm network of the crustacean stomatogastric ganglion (STG) [2]. Typically, network models of the STG treat neurons of the same functional type as a single model neuron (e.g. PD neurons), assuming the same conductance parameters for these neurons and implying their synchronous firing [3, 4]. However, simultaneous recordings of PD neurons show differences between the timings of spikes of these neurons. This may indicate functional variability of these neurons. Here we modelled separately the two PD neurons of the STG in a multi-neuron model of the pyloric network. Our neuron models comply with known correlations between conductance parameters of ionic currents. Our results reproduce the experimental finding of increasing spike time distance between spikes originating from the two model PD neurons during their synchronised burst phase. The PD neuron with the larger calcium conductance generates its spikes before the other PD neuron. Larger potassium conductance values in the follower neuron imply longer delays between spikes (see Fig. 17). Neuromodulators change the conductance parameters of neurons and maintain the ratios of these parameters [5]. Our results show that such changes may shift the individual contribution of the two PD neurons to the PD phase of the pyloric rhythm, altering their functionality within this rhythm. Our work paves the way towards an accessible experimental and computational framework for the analysis of the mechanisms and impact of functional variability of neurons within the neural circuits to which they belong.
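
    A heavily simplified toy illustration of the reported effect, using leaky integrate-and-fire stand-ins rather than the abstract's conductance-based PD models (all numbers invented): a larger depolarizing conductance, a crude proxy for the calcium conductance, makes one neuron reach threshold earlier within a shared burst.

    import numpy as np

    dt, T = 1e-4, 20000
    t = np.arange(T) * dt
    drive = (np.sin(2 * np.pi * 1.0 * t) > 0).astype(float)   # 1 Hz bursts

    def spike_times(g_dep):
        # Leaky integrator with extra depolarizing gain g_dep; fires and
        # resets when v crosses threshold 1.0. All constants invented.
        v, out = 0.0, []
        for i in range(T):
            v += dt * (-v / 0.02 + (1.0 + g_dep) * drive[i] * 60.0)
            if v > 1.0:
                out.append(t[i])
                v = 0.0
        return np.array(out)

    pd1 = spike_times(g_dep=0.2)   # larger "calcium-like" conductance
    pd2 = spike_times(g_dep=0.0)
    print(pd1[0], pd2[0])          # pd1 reaches threshold before pd2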

    Channels and circuits: biophysical and network models of neuronal function

    Neuroscience research studies the brain at many different levels of detail. Experimental work explores everything from the molecular machinery within each neuron to the behavioral output of an organism. Similarly, computational models are designed to operate at many scales, and can address different questions depending upon the complexity of the description. Detailed biophysical modelling incorporates findings about neuronal physiology and structure in order to test hypotheses and generate predictions. However, due to the complexity and number of modelling studies, it is difficult to compare models and to assess biological fidelity. In this work we present a curated database of nearly 3000 voltage-gated ion channel models used in published neuronal simulations, called ICGenealogy. Furthermore, we present a standardized formulation for ion channel dynamics, which we fit to all models in this database. Through these two endeavors, we facilitate better experimentally-constrained modelling, while also providing insight into the diversity and complexity of ion channel dynamics seen in single neurons. At a higher level, neuronal network models integrate findings about neuronal activity and cell types in order to explain representations and computations. Here, we present a model of context-dependent associative memory, which incorporates known principles of memory function from psychology research and proposes concrete functional roles for different components of neural circuits. Through analytics and simulations, we show that a contextual memory system not only provides benefits for memory capacity and robustness, but also enables control of memory expression. This work provides new conceptual ideas for memory research, suggests functional roles for different inhibitory cell types, and may help us to understand the interaction of different memory systems in the brain. Together, the work presented here spans two distinct levels of detail and addresses current challenges both in biophysically detailed models and in computation in abstract network models.
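
    Standardized formulations of channel kinetics typically take the usual gating-variable form of Hodgkin-Huxley-style models; a minimal sketch follows (the thesis's exact parameterization may differ, and all constants below are invented): each gating variable relaxes toward a sigmoidal steady state with a bell-shaped, voltage-dependent time constant.

    import numpy as np

    def x_inf(V, V_half=-40.0, k=9.0):
        # Sigmoidal steady-state activation (constants invented)
        return 1.0 / (1.0 + np.exp(-(V - V_half) / k))

    def tau_x(V, tau_max=5.0, V_half=-40.0, sigma=20.0):
        # Bell-shaped voltage-dependent time constant, in ms
        return 0.1 + tau_max * np.exp(-((V - V_half) / sigma) ** 2)

    # Integrate one gating variable under a voltage-clamp step (Euler, ms).
    dt, V = 0.01, -65.0
    x = x_inf(V)
    trace = []
    for step in range(5000):
        V = -20.0 if step > 1000 else -65.0     # voltage step at t = 10 ms
        x += dt * (x_inf(V) - x) / tau_x(V)
        trace.append(x)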

    Context-modular memory networks support high-capacity, flexible, and robust associative memories

    Context, such as behavioral state, is known to modulate memory formation and retrieval, but is usually ignored in associative memory models. Here, we propose several types of contextual modulation for associative memory networks that greatly increase their performance. In these networks, context inactivates specific neurons and connections, which modulates the effective connectivity of the network. Memories are stored only by the active components, thereby reducing interference from memories acquired in other contexts. Such networks exhibit several beneficial characteristics, including enhanced memory capacity, high robustness to noise, increased robustness to memory overloading, and better memory retention during continual learning. Furthermore, memories can be biased to have different relative strengths, or even gated on or off, according to contextual cues, providing a candidate model for cognitive control of memory and efficient memory search. An external context-encoding network can dynamically switch the memory network to a desired state, which we liken to experimentally observed contextual signals in prefrontal cortex and hippocampus. Overall, our work illustrates the benefits of organizing memory around context, and provides an important link between behavioral studies of memory and mechanistic details of neural circuits.
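
    A minimal sketch of the core idea, using a Hopfield-style network with random context masks (network sizes, the gating scheme, and the dynamics are illustrative assumptions; the paper analyzes several variants): patterns are stored only on each context's active subnetwork, which reduces cross-context interference.

    import numpy as np

    rng = np.random.default_rng(2)
    N, n_ctx, per_ctx = 400, 4, 10

    masks = rng.random((n_ctx, N)) < 0.5     # active neurons per context
    patterns = rng.choice([-1, 1], size=(n_ctx, per_ctx, N))

    W = np.zeros((N, N))
    for c in range(n_ctx):
        m = masks[c].astype(float)
        for p in patterns[c]:
            pm = p * m                       # store only the active components
            W += np.outer(pm, pm) / N
    np.fill_diagonal(W, 0.0)

    def recall(cue, ctx, steps=20):
        # Context gates the dynamics: inactive neurons are clamped to zero.
        m = masks[ctx]
        s = np.where(m, cue, 0.0)
        for _ in range(steps):
            h = W @ s
            s = np.where(m, np.where(h >= 0, 1.0, -1.0), 0.0)
        return s

    # Corrupt a stored pattern, then recall it within its own context.
    cue = patterns[0, 0].astype(float)
    flip = rng.choice(N, 60, replace=False)
    cue[flip] *= -1
    out = recall(cue, ctx=0)
    acc = (out[masks[0]] == patterns[0, 0][masks[0]]).mean()
    print(acc)                               # fraction of active units recovered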